Riemannian Conjugate Gradient Methods: General Framework and Specific Algorithms with Convergence Analyses

Authors

Abstract

This paper proposes a novel general framework of Riemannian conjugate gradient methods, that is, conjugate gradient methods on Riemannian manifolds. Conjugate gradient methods are important first-order optimization algorithms both in Euclidean spaces and on Riemannian manifolds. While various types of conjugate gradient methods have been studied in Euclidean spaces, there have been fewer studies of those on Riemannian manifolds. In each iteration of a Riemannian conjugate gradient method, the previous search direction must be transported to the current tangent space so that it can be added to the negative gradient of the objective function at the current point. There are several approaches to transporting a tangent vector to another tangent space. Therefore, there are more variants of conjugate gradient methods in the Riemannian case than in the Euclidean case. In order to investigate them in detail, the proposed framework unifies existing Riemannian conjugate gradient methods, such as ones utilizing a vector transport or inverse retraction, and also develops other methods not covered in previous studies. Furthermore, sufficient conditions for the convergence of a class of algorithms in the proposed framework are clarified. Moreover, the global convergence properties of several specific algorithms are extensively analyzed. The analyses provide theoretical results for some algorithms in a more general setting, as well as completely new developments for other algorithms. Numerical experiments are performed to confirm the validity of the theoretical results and to compare the performances of specific algorithms in the proposed framework.
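The iteration described above (transport the previous search direction to the current tangent space, then combine it with the negative Riemannian gradient) can be sketched concretely. The example below is a minimal illustrative instance on the unit sphere, minimizing a Rayleigh quotient; the particular choices made here (normalization as the retraction, orthogonal projection as the vector transport, the Fletcher-Reeves coefficient, Armijo backtracking with a restart safeguard) are simple textbook ingredients assumed for illustration, not the specific algorithms analyzed in the paper.

```python
import numpy as np

def riemannian_cg_sphere(A, x0, steps=500):
    """Illustrative Riemannian CG sketch on the unit sphere S^{n-1}.

    Minimizes the Rayleigh quotient f(x) = x^T A x over the sphere; the
    minimizer is a unit eigenvector of A for the smallest eigenvalue.
    """
    f = lambda x: x @ A @ x
    proj = lambda x, v: v - (x @ v) * x            # project v onto the tangent space T_x S^{n-1}
    grad = lambda x: proj(x, 2.0 * A @ x)          # Riemannian gradient of f at x
    retract = lambda x, v: (x + v) / np.linalg.norm(x + v)  # retraction: step, then renormalize

    x = x0 / np.linalg.norm(x0)
    g = grad(x)
    d = -g                                         # initial search direction
    for _ in range(steps):
        if np.linalg.norm(g) < 1e-10:              # (approximate) stationarity reached
            break
        if g @ d >= 0:                             # safeguard: restart if d is not a descent direction
            d = -g
        alpha = 1.0                                # Armijo backtracking line search
        while f(retract(x, alpha * d)) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5
        x_new = retract(x, alpha * d)
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)           # Fletcher-Reeves coefficient
        d = -g_new + beta * proj(x_new, d)         # transport old direction, add negative gradient
        x, g = x_new, g_new
    return x
```

The projection step plays the role of the vector transport: without it, the previous direction `d` would no longer lie in the tangent space at the new iterate, so the conjugate-gradient combination would not be well defined on the manifold.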


Related articles

Convergence Properties of Nonlinear Conjugate Gradient Methods

Recently, important contributions on convergence studies of conjugate gradient methods have been made by Gilbert and Nocedal [6]. They introduce a "sufficient descent condition" to establish global convergence results, whereas this condition is not needed in the convergence analyses of Newton and quasi-Newton methods. [6] hints that the sufficient descent condition, which was enforced by their ...


Application of frames in Chebyshev and conjugate gradient methods

Given a frame of a separable Hilbert space $H$, we present some iterative methods for solving an operator equation $Lu=f$, where $L$ is a bounded, invertible and symmetric operator on $H$. We present some algorithms based on the knowledge of frame bounds, the Chebyshev method and the conjugate gradient method, in order to give some approximated solutions to the problem. Then we i...


Global Convergence of Conjugate Gradient Methods without Line Search

Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results include the following cases: 1. The Fletcher-Reeves method, the Hestenes-Stiefel method, and the Dai-Yuan method applied to a strongly convex LC¹ objective function; 2. The Polak-Ribière method and the Conjugate ...


Global Convergence Properties of Conjugate Gradient Methods for Optimization

This paper explores the convergence of nonlinear conjugate gradient methods without restarts, and with practical line searches. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribir...


Global convergence of two spectral conjugate gradient methods

Two new nonlinear spectral conjugate gradient methods for solving unconstrained optimization problems are proposed. One is based on the Hestenes and Stiefel (HS) method and the spectral conjugate gradient method. The other is based on a mixed spectral HS-CD conjugate gradient method, which combines the advantages of the spectral conjugate gradient method, the HS method, and the CD method. The d...



Journal

Journal title: SIAM Journal on Optimization

Year: 2022

ISSN: 1095-7189, 1052-6234

DOI: https://doi.org/10.1137/21m1464178